Cascading Asymmetric Linear Classifiers

Author

  • Luis Perez-Breva

Abstract

Motivation: Combinations of classifiers have been found useful empirically, yet there is no formal proof of their generalization ability. Our goal is to develop an algorithm that trains a sequence of linear classifiers yielding a nonlinear decision surface. We believe that choosing asymmetric regularization parameters for each class can yield a sequence of classifiers that approximates the Bayes error arbitrarily closely, modulo the density of the sampling.
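The abstract's core idea — a cascade of linear classifiers, each trained with asymmetric per-class penalties, whose composition yields a nonlinear decision surface — can be sketched as follows. This is an illustrative assumption-laden sketch, not the authors' algorithm: the function names, the use of class-weighted logistic regression as the "asymmetric linear classifier", and the accept/pass-on rule are all choices made here for demonstration.

```python
import numpy as np

def train_weighted_linear(X, y, w_pos=1.0, w_neg=1.0, lr=0.1, epochs=200):
    """Logistic regression with asymmetric (per-class) loss weights."""
    w, b = np.zeros(X.shape[1]), 0.0
    sw = np.where(y == 1, w_pos, w_neg)          # per-sample weight by class
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
        g = sw * (p - y)                         # weighted log-loss gradient
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

def train_cascade(X, y, n_stages=3, w_neg=10.0):
    """Each stage is biased against false positives; only points a stage
    accepts (score > 0) are passed on to the next stage."""
    stages, idx = [], np.arange(len(y))
    for _ in range(n_stages):
        if len(np.unique(y[idx])) < 2:           # nothing left to separate
            break
        w, b = train_weighted_linear(X[idx], y[idx], w_neg=w_neg)
        stages.append((w, b))
        idx = idx[X[idx] @ w + b > 0]            # survivors reach next stage
        if idx.size == 0:
            break
    return stages

def predict_cascade(stages, X):
    """A point is labeled positive only if every stage accepts it."""
    pred = np.ones(len(X), dtype=bool)
    for w, b in stages:
        pred &= (X @ w + b) > 0
    return pred.astype(int)
```

Because each stage only sees the points its predecessors accepted, later stages fit a different region of the input space, and the conjunction of the stage half-spaces forms a piecewise-linear acceptance region — the sense in which a sequence of linear classifiers can trace a nonlinear boundary.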


Similar Resources

MultiStage Cascading of Multiple Classifiers: One Man's Noise is Another Man's Data

For building implementable and industry-valuable classification solutions, machine learning methods must focus not only on accuracy but also on computational and space complexity. We discuss a multistage method, namely cascading, where there is a sequence of classifiers ordered in terms of increasing complexity and specificity such that early classifiers are simple and general whereas later ones...


An Evolutionary Approach To Cascade Multiple Classifiers: A Case-Study To Analyze Textual Content Of Medical Records And Identify Potential Diagnosis

This paper describes an experiment where classifiers are used to identify potential diagnoses by examining the textual content of medical records. Three classifiers are applied separately (k-nearest neighbors, multilayer perceptron, and support vector machines) and also combined in two different approaches (parallel and cascading); results show that even if accuracy points to a best alternative, ROC ...


Cascading Classifiers for Named Entity Recognition in Clinical Notes

Clinical named entities convey a great deal of knowledge in clinical notes. This paper investigates named entity recognition from clinical notes using machine learning approaches. We present a cascading system that uses a Conditional Random Fields model, a Support Vector Machine, and a Maximum Entropy model to reclassify the identified entities in order to reduce misclassification. A voting strategy was e...


Using Asymmetric Distributions to Improve Classifier Probabilities: A Comparison of New and Standard Parametric Methods

For many discriminative classifiers, it is desirable to convert an unnormalized confidence score output from the classifier to a normalized probability estimate. Such a method can also be used for creating better estimates from a probabilistic classifier that outputs poor estimates. Typical parametric methods have an underlying assumption that the score distribution for a class is symmetric; we...
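The standard symmetric parametric method this abstract contrasts against is typically Platt-style sigmoid calibration: fit a two-parameter logistic to the raw scores. A minimal sketch of that symmetric baseline (the function names and the plain gradient-descent fit are illustrative choices; the paper's asymmetric variants are not reproduced here):

```python
import numpy as np

def platt_calibrate(scores, labels, lr=0.01, epochs=2000):
    """Fit p(y=1 | s) = sigmoid(a*s + c) to raw classifier scores by
    minimizing log-loss with plain gradient descent."""
    a, c = 1.0, 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(a * scores + c)))
        g = p - labels                       # per-sample log-loss gradient
        a -= lr * (g * scores).mean()
        c -= lr * g.mean()
    return a, c

def calibrated_prob(score, a, c):
    """Map a raw score to a calibrated probability estimate."""
    return 1.0 / (1.0 + np.exp(-(a * score + c)))
```

A single sigmoid imposes the same shape on both tails of the score distribution, which is exactly the symmetry assumption the abstract questions.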


Maximum Margin Classifiers with Specified False Positive and False Negative Error Rates

This paper addresses the problem of maximum margin classification given the moments of class conditional densities and the false positive and false negative error rates. Using Chebyshev inequalities, the problem can be posed as a second order cone programming problem. The dual of the formulation leads to a geometric optimization problem, that of computing the distance between two ellipsoids, wh...
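The Chebyshev step mentioned above is commonly written in minimax-probability-machine form. The statement below is an illustrative reconstruction of the technique, assuming only a class mean \(\mu\) and covariance \(\Sigma\) are known and using a generic error-rate bound \(\eta\); it is not necessarily the paper's exact formulation:

```latex
\inf_{x \sim (\mu, \Sigma)} \Pr\!\left( w^{\top} x + b \ge 0 \right) \ge 1 - \eta
\quad\Longleftrightarrow\quad
w^{\top} \mu + b \;\ge\; \sqrt{\tfrac{1-\eta}{\eta}} \, \bigl\| \Sigma^{1/2} w \bigr\|_{2}
```

The right-hand inequality is a second-order cone constraint in \((w, b)\), which is why bounding the false positive and false negative rates via Chebyshev inequalities turns maximum margin classification into a second-order cone program.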



Publication date: 2002